Approximate inversion of the wave-equation Hessian via randomized matrix probing
Author
Abstract
We present a method for approximately inverting the Hessian of full waveform inversion as a dip-dependent and scale-dependent amplitude correction. The terms in the expansion of this correction are determined by least-squares fitting from a handful of applications of the Hessian to random models, a procedure called matrix probing. We give numerical evidence that randomness is important for generating a robust preconditioner, i.e., one that works regardless of the model to be corrected. To be successful, matrix probing requires an accurate determination of the nullspace of the Hessian, which we propose to implement as a local, dip-dependent mask in curvelet space. Numerical experiments show that the novel preconditioner captures 70% of the inverse Hessian (in the Frobenius norm) for the one-parameter acoustic 2D Marmousi model.
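The probing-and-fitting idea can be illustrated outside the seismic setting. The sketch below is our own toy construction, not the paper's curvelet-based scheme: a symmetric positive definite matrix A stands in for the wave-equation Hessian, the expansion basis is the simple polynomial family {I, A, A^2, ...} rather than a pseudodifferential symbol basis, and the coefficients of an approximate inverse are fitted by least squares from a handful of applications of A to random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 50, 5, 10          # matrix size, basis size, number of random probes

# Stand-in "Hessian": A = I + 0.5*S with ||S||_2 = 1, so A is SPD and
# well conditioned; in the paper A is the wave-equation normal operator.
S = rng.normal(size=(n, n))
S = (S + S.T) / 2
S /= np.linalg.norm(S, 2)
A = np.eye(n) + 0.5 * S

# Probe: for random models w, record y = A w and the basis responses
# A^j y, then fit sum_j c_j A^j (A w) ~ w by least squares.
rows, rhs = [], []
for _ in range(q):
    w = rng.normal(size=n)
    y = A @ w                          # one application of the "Hessian"
    cols = [y.copy()]
    for _ in range(p - 1):
        cols.append(A @ cols[-1])      # A^j y for j = 0..p-1
    rows.append(np.column_stack(cols))
    rhs.append(w)
M = np.vstack(rows)
c, *_ = np.linalg.lstsq(M, np.concatenate(rhs), rcond=None)

# The fitted polynomial in A now acts as an approximate inverse.
x = rng.normal(size=n)
y = A @ x
z, acc = np.zeros(n), y.copy()
for cj in c:
    z += cj * acc                      # z = sum_j c_j A^j y ~ A^{-1} y = x
    acc = A @ acc
print(np.linalg.norm(z - x) / np.linalg.norm(x))
```

Because A here is well conditioned, a degree-four polynomial already gives a small relative error; the interesting part of the paper is making an analogous expansion work for the ill-conditioned, nullspace-afflicted wave-equation Hessian.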
Similar papers
Matrix probing: a randomized preconditioner for the wave-equation Hessian
This paper considers the problem of approximating the inverse of the wave-equation Hessian, also called the normal operator, in seismology and other types of wave-based imaging. An expansion scheme for the pseudodifferential symbol of the inverse Hessian is set up. The coefficients in this expansion are found via least-squares fitting from a certain number of applications of the normal operator on ...
Edge Detection with Hessian Matrix Property Based on Wavelet Transform
In this paper, we present an edge detection method based on the wavelet transform and the Hessian matrix of the image at each pixel. Many wavelet-based methods use the wavelet transform to approximate the gradient of the image and detect edges by searching for the modulus maxima of the gradient vectors. In our scheme, we instead use the wavelet transform to approximate the Hessian matrix of the image at each pixel. ...
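As a rough illustration of the per-pixel Hessian idea (using plain finite differences rather than the paper's wavelet transform, and an eigenvalue-based response of our own choosing):

```python
import numpy as np

# Synthetic test image: a smooth Gaussian blob.
y, x = np.mgrid[0:64, 0:64]
img = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 50.0)

# Per-pixel second derivatives via repeated finite differences;
# the paper approximates these with a wavelet transform instead.
Iy, Ix = np.gradient(img)      # np.gradient returns (d/dy, d/dx)
Ixy, Ixx = np.gradient(Ix)
Iyy, Iyx = np.gradient(Iy)

# Eigenvalues of the 2x2 symmetric Hessian at each pixel (closed form).
mean = (Ixx + Iyy) / 2.0
root = np.sqrt(((Ixx - Iyy) / 2.0) ** 2 + Ixy ** 2)
lam1, lam2 = mean + root, mean - root

# Edge/ridge response: the largest eigenvalue magnitude per pixel.
response = np.maximum(np.abs(lam1), np.abs(lam2))
```

Large Hessian eigenvalues flag pixels where the intensity surface curves sharply, which is what a second-derivative edge detector looks for.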
Using an Efficient Penalty Method for Solving Linear Least Square Problem with Nonlinear Constraints
In this paper, we use a penalty method to solve the linear least-squares problem with nonlinear constraints. Each iteration of the penalty method requires calculating the projected Hessian matrix. Given that the objective function is linear least squares, the projected Hessian matrix of the penalty function consists of two parts, the exact amount of a part of i...
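A minimal quadratic-penalty sketch on a toy problem of our own (not the paper's algorithm): minimize ||Ax - b||^2 subject to ||x||^2 = 1, solving each penalized subproblem by Gauss-Newton. Note how the Gauss-Newton Hessian J^T J splits into a fixed least-squares part A^T A plus a mu-dependent constraint part, echoing the two-part Hessian structure the abstract alludes to:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(10, 3))
b = rng.normal(size=10)

# Constraint g(x) = ||x||^2 - 1 = 0, enforced by a quadratic penalty
# mu * g(x)^2 with an increasing penalty parameter mu.
x, *_ = np.linalg.lstsq(A, b, rcond=None)        # warm start: unconstrained solution
for mu in [1.0, 1e1, 1e2, 1e3, 1e4]:
    for _ in range(20):                          # Gauss-Newton inner loop
        g = x @ x - 1.0
        r = np.concatenate([A @ x - b, [np.sqrt(mu) * g]])
        J = np.vstack([A, np.sqrt(mu) * 2.0 * x[None, :]])
        # J.T @ J = A.T @ A + 4*mu*outer(x, x): the two-part Hessian.
        step = np.linalg.solve(J.T @ J + 1e-10 * np.eye(3), J.T @ r)
        x = x - step
print(abs(x @ x - 1.0))   # constraint violation shrinks as mu grows
```

Warm-starting each subproblem from the previous mu keeps the Gauss-Newton iterations in the convergence basin, which is the standard way quadratic penalty methods are run in practice.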
Matrix Probing and its Conditioning
When a matrix A with n columns is known to be well-approximated by a linear combination of basis matrices B1, . . . , Bp, we can apply A to a random vector and solve a linear system to recover this linear combination. The same technique can be used to obtain an approximation to A−1. A basic question is whether this linear system is well-conditioned. This is important for two reasons: a well-con...
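The recovery step described above takes only a few lines (with synthetic random basis matrices of our own choosing): build A as a known combination of B1, ..., Bp, apply it to one random vector, and solve the small least-squares system whose condition number is what the paper analyzes:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5

# Fixed basis matrices B_1..B_p and a matrix A exactly in their span.
B = rng.normal(size=(p, n, n))
c_true = rng.normal(size=p)
A = np.einsum('j,jkl->kl', c_true, B)

# One probe: apply A to a random vector and stack the basis responses.
w = rng.normal(size=n)
M = np.column_stack([Bj @ w for Bj in B])   # n x p linear system
c_rec, *_ = np.linalg.lstsq(M, A @ w, rcond=None)

print(np.linalg.cond(M))                    # conditioning of the probe system
print(np.allclose(c_rec, c_true))           # recovery up to roundoff
```

Since n is much larger than p, a single random probe already gives an overdetermined, generically well-conditioned system, which is the favorable regime the paper's analysis identifies.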
Low Complexity Damped Gauss-Newton Algorithms for CANDECOMP/PARAFAC
The damped Gauss-Newton (dGN) algorithm for CANDECOMP/PARAFAC (CP) decomposition can handle the challenges of collinearity of factors and factors of different magnitudes; nevertheless, for the factorization of an N-D tensor of size I1 × · · · × IN with rank R, the algorithm is computationally demanding due to the construction of a large approximate Hessian of size (RT × RT) and its inversion, where T = n...
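To make the cost concrete, here is a tiny damped Gauss-Newton CP fit of our own (Levenberg-Marquardt style damping, finite-difference Jacobian, all names hypothetical). With factor matrices of sizes I_n × R, the stacked parameter vector has length RT with T = I1 + I2 + I3, and the dense approximate Hessian J^T J is RT × RT, which is exactly the object whose construction and inversion make the full algorithm expensive at scale:

```python
import numpy as np

rng = np.random.default_rng(1)
I1 = I2 = I3 = 4
R = 2

# Ground-truth rank-R tensor built from random factor matrices.
A0, B0, C0 = (rng.normal(size=(d, R)) for d in (I1, I2, I3))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)

sizes = [I1 * R, I2 * R, I3 * R]   # total parameters: R*T with T = I1+I2+I3

def residual(theta):
    a, b, c = np.split(theta, np.cumsum(sizes)[:-1])
    A, B, C = a.reshape(I1, R), b.reshape(I2, R), c.reshape(I3, R)
    return (np.einsum('ir,jr,kr->ijk', A, B, C) - T).ravel()

theta = 0.5 * rng.normal(size=sum(sizes))
err0 = err = np.linalg.norm(residual(theta))
lam, eps = 1e-2, 1e-6
for _ in range(60):
    r = residual(theta)
    # Finite-difference Jacobian; J.T @ J is the (RT x RT) approximate Hessian.
    J = np.empty((r.size, theta.size))
    for j in range(theta.size):
        tp = theta.copy()
        tp[j] += eps
        J[:, j] = (residual(tp) - r) / eps
    step = np.linalg.solve(J.T @ J + lam * np.eye(theta.size), J.T @ r)
    if np.linalg.norm(residual(theta - step)) < err:
        theta -= step                       # accept step, relax damping
        err = np.linalg.norm(residual(theta))
        lam *= 0.5
    else:
        lam *= 10.0                         # reject step, increase damping
print(err0, err)
```

For this 4 × 4 × 4, rank-2 example the Hessian is only 24 × 24, but the RT × RT scaling makes the same dense solve prohibitive for realistic tensor sizes, which motivates the low-complexity variants the paper develops.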